# Supercomputing training

**Bielik 4.5B V3.0 Instruct** · Apache-2.0 · speakleash · 1,121 downloads · 13 likes
Bielik-4.5B-v3-Instruct is a 4.6-billion-parameter Polish generative text model, fine-tuned from Bielik-4.5B-v3, with strong Polish-language comprehension and text-processing capabilities.
Tags: Large Language Model, Transformers, Other
**Bielik 11B V2.3 Instruct** · Apache-2.0 · speakleash · 29.32k downloads · 51 likes
Bielik-11B-v2.3-Instruct is an 11-billion-parameter generative text model built specifically for Polish, developed by SpeakLeash in collaboration with ACK Cyfronet AGH.
Tags: Large Language Model, Transformers, Other
**Bielik 11B V2** · Apache-2.0 · speakleash · 690 downloads · 40 likes
Bielik-11B-v2 is an 11-billion-parameter generative text model developed and trained specifically for Polish. It is initialized from Mistral-7B-v0.2 and trained on 400 billion tokens.
Tags: Large Language Model, Transformers, Other
© 2025 AIbase